Python Job: Data Engineer

Company

Neospot Pty Ltd

Location

Docklands - Australia

Job type

Full-Time

Python Job Details

This is a full-time opportunity with our client, a major Australian bank and AFR winner of best workplaces. The role is based in Melbourne.

About the role

As a Data Engineer, you will be comfortable being hands-on, as well as a coach to the team. The role exists within a fast-moving insights team working on building a data-driven digital channel.

The Data Engineer will also work with senior data science management, and with departments beyond Data and Analytics, to analyse and understand data sources, participate in design, and provide insights and guidance on database technology and data modelling best practices.

  • Design and build solutions that meet user/customer needs
  • Build and maintain inclusive relationships with internal and external stakeholders, and influence/negotiate change
  • Make sound and timely decisions, re-prioritising work in response to customer needs
  • Challenge yourself and the business on the value that will be delivered to customers
  • Work in collaborative and inclusive teams to build innovative solutions, and help guide and support less experienced team members in the Tribe
  • Proactively learn and take on additional tasks and analysis to assist in the achievement of team goals
  • Design and deliver solutions on a variety of technologies that meet user needs
  • Demonstrate a strong understanding of the underlying technologies and the skill to build sustainable engineering/solution development disciplines within your team
  • Understand solution development lifecycles and relevant delivery controls
  • Identify, monitor, and manage risks, issues, and dependencies, agreeing on appropriate risk responses
  • Coordinate and guide external teams to deliver data products to downstream consumers

About You

  • Ability to optimise data flows by building robust, fault-tolerant data pipelines that clean, transform, and aggregate data into databases or data sources
  • Hands-on experience with cloud platforms (GCP preferred) and with an on-premises enterprise big data platform based on Cloudera and the Hadoop ecosystem
  • Extensive experience in designing and building data pipelines from data ingestion to consumption within a hybrid big data architecture, using Cloud Native products
  • Proficiency in Python and C#, the ability to compose effective SQL queries, and prior experience with dbt
  • Experience with building microservices applications on Kubernetes
  • A strong understanding of software development techniques such as Object-Oriented design, Test-Driven Development, and Continuous Integration/Continuous Delivery
  • Hands-on skillset with orchestration tools such as Control-M and Airflow
  • Ability to build programs and scripts on Hadoop using Pig, Hive, and Spark
  • Ability to build strong relationships with, and influence, stakeholders at all levels
  • Collaborative approach along with being an effective listener and presenter
  • Strong analytical and problem-solving skills
  • Prior experience with Data Life Cycle Management, Data Governance, and Data Quality Control
  • Basic understanding of Machine Learning techniques (supervised/unsupervised)

This critical role will pay an above market salary package.

Job Types: Full-time, Permanent, Fixed term
Contract length: 12 months

Salary: $950.00 per day

Benefits:

  • Work from home

Schedule:

  • Day shift
  • Flexible hours
  • Monday to Friday

Work Authorisation:

  • Australia (Preferred)

Work Location: Hybrid remote in Docklands, VIC 3008